Console Output

Training and evaluating model for: Other
Dataset length: 13590 windows

NILMModel(
  (conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
  (lstm): LSTM(9, 256, num_layers=4, batch_first=True, dropout=0.1)
  (dropout): Dropout(p=0.1, inplace=False)
  (relu): ReLU()
  (output_layer): Linear(in_features=256, out_features=1, bias=True)
)
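For reference, a minimal PyTorch sketch that reproduces the printed module list. The forward-pass ordering (conv, ReLU, LSTM, dropout, then a per-timestep linear head) is an assumption — the printout only shows the submodules, not how they are wired:

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """Conv1d -> LSTM -> Linear regressor matching the printed module list."""

    def __init__(self, n_features: int = 9, hidden: int = 256):
        super().__init__()
        self.conv1d = nn.Conv1d(n_features, n_features, kernel_size=3, stride=1, padding=1)
        self.lstm = nn.LSTM(n_features, hidden, num_layers=4, batch_first=True, dropout=0.1)
        self.dropout = nn.Dropout(p=0.1)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features); Conv1d expects channels first
        x = self.relu(self.conv1d(x.permute(0, 2, 1))).permute(0, 2, 1)
        out, _ = self.lstm(x)            # (batch, seq_len, hidden)
        out = self.dropout(out)
        return self.output_layer(out)    # one power estimate per timestep

model = NILMModel()
y = model(torch.randn(2, 60, 9))  # 2 windows of 60 timesteps, 9 input features
print(tuple(y.shape))             # (2, 60, 1)
```

With `padding=1` and `kernel_size=3` the conv layer preserves the sequence length, so the window size (e.g. 60 above) is arbitrary.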
Epoch [1/300], Train Loss: 0.002586
Validation Loss: 0.000964
Epoch [2/300], Train Loss: 0.000972
Validation Loss: 0.000845
Epoch [3/300], Train Loss: 0.000865
Validation Loss: 0.000765
Epoch [4/300], Train Loss: 0.000805
Validation Loss: 0.000733
Epoch [5/300], Train Loss: 0.000778
Validation Loss: 0.000715
Epoch [6/300], Train Loss: 0.000755
Validation Loss: 0.000693
Epoch [7/300], Train Loss: 0.000740
Validation Loss: 0.000684
Epoch [8/300], Train Loss: 0.000722
Validation Loss: 0.000664
Epoch [9/300], Train Loss: 0.000704
Validation Loss: 0.000645
Epoch [10/300], Train Loss: 0.000686
Validation Loss: 0.000629
Epoch [11/300], Train Loss: 0.000666
Validation Loss: 0.000606
Epoch [12/300], Train Loss: 0.000643
Validation Loss: 0.000584
Epoch [13/300], Train Loss: 0.000611
Validation Loss: 0.000543
Epoch [14/300], Train Loss: 0.000565
Validation Loss: 0.000487
Epoch [15/300], Train Loss: 0.000505
Validation Loss: 0.000412
Epoch [16/300], Train Loss: 0.000422
Validation Loss: 0.000313
Epoch [17/300], Train Loss: 0.000310
Validation Loss: 0.000229
Epoch [18/300], Train Loss: 0.000240
Validation Loss: 0.000199
Epoch [19/300], Train Loss: 0.000209
Validation Loss: 0.000180
Epoch [20/300], Train Loss: 0.000187
Validation Loss: 0.000169
Epoch [21/300], Train Loss: 0.000173
Validation Loss: 0.000151
Epoch [22/300], Train Loss: 0.000162
Validation Loss: 0.000143
Epoch [23/300], Train Loss: 0.000153
Validation Loss: 0.000144
Epoch [24/300], Train Loss: 0.000148
Validation Loss: 0.000137
Epoch [25/300], Train Loss: 0.000143
Validation Loss: 0.000133
Epoch [26/300], Train Loss: 0.000141
Validation Loss: 0.000134
Epoch [27/300], Train Loss: 0.000139
Validation Loss: 0.000131
Epoch [28/300], Train Loss: 0.000136
Validation Loss: 0.000129
Epoch [29/300], Train Loss: 0.000134
Validation Loss: 0.000129
Epoch [30/300], Train Loss: 0.000133
Validation Loss: 0.000125
Epoch [31/300], Train Loss: 0.000131
Validation Loss: 0.000134
Epoch [32/300], Train Loss: 0.000130
Validation Loss: 0.000124
Epoch [33/300], Train Loss: 0.000128
Validation Loss: 0.000125
Epoch [34/300], Train Loss: 0.000128
Validation Loss: 0.000123
Epoch [35/300], Train Loss: 0.000127
Validation Loss: 0.000123
Epoch [36/300], Train Loss: 0.000126
Validation Loss: 0.000124
Epoch [37/300], Train Loss: 0.000126
Validation Loss: 0.000121
Epoch [38/300], Train Loss: 0.000125
Validation Loss: 0.000125
Epoch [39/300], Train Loss: 0.000129
Validation Loss: 0.000120
Epoch [40/300], Train Loss: 0.000124
Validation Loss: 0.000123
Epoch [41/300], Train Loss: 0.000123
Validation Loss: 0.000119
Epoch [42/300], Train Loss: 0.000122
Validation Loss: 0.000123
Epoch [43/300], Train Loss: 0.000122
Validation Loss: 0.000119
Epoch [44/300], Train Loss: 0.000121
Validation Loss: 0.000117
Epoch [45/300], Train Loss: 0.000121
Validation Loss: 0.000119
Epoch [46/300], Train Loss: 0.000120
Validation Loss: 0.000117
Epoch [47/300], Train Loss: 0.000121
Validation Loss: 0.000116
Epoch [48/300], Train Loss: 0.000119
Validation Loss: 0.000119
Epoch [49/300], Train Loss: 0.000119
Validation Loss: 0.000115
Epoch [50/300], Train Loss: 0.000119
Validation Loss: 0.000114
Epoch [51/300], Train Loss: 0.000117
Validation Loss: 0.000115
Epoch [52/300], Train Loss: 0.000117
Validation Loss: 0.000114
Epoch [53/300], Train Loss: 0.000117
Validation Loss: 0.000116
Epoch [54/300], Train Loss: 0.000116
Validation Loss: 0.000118
Epoch [55/300], Train Loss: 0.000115
Validation Loss: 0.000113
Epoch [56/300], Train Loss: 0.000115
Validation Loss: 0.000116
Epoch [57/300], Train Loss: 0.000115
Validation Loss: 0.000112
Epoch [58/300], Train Loss: 0.000115
Validation Loss: 0.000116
Epoch [59/300], Train Loss: 0.000114
Validation Loss: 0.000112
Epoch [60/300], Train Loss: 0.000114
Validation Loss: 0.000112
Epoch [61/300], Train Loss: 0.000112
Validation Loss: 0.000110
Epoch [62/300], Train Loss: 0.000113
Validation Loss: 0.000113
Epoch [63/300], Train Loss: 0.000112
Validation Loss: 0.000110
Epoch [64/300], Train Loss: 0.000112
Validation Loss: 0.000111
Epoch [65/300], Train Loss: 0.000112
Validation Loss: 0.000111
Epoch [66/300], Train Loss: 0.000111
Validation Loss: 0.000111
Epoch [67/300], Train Loss: 0.000111
Validation Loss: 0.000107
Epoch [68/300], Train Loss: 0.000111
Validation Loss: 0.000110
Epoch [69/300], Train Loss: 0.000110
Validation Loss: 0.000107
Epoch [70/300], Train Loss: 0.000111
Validation Loss: 0.000107
Epoch [71/300], Train Loss: 0.000109
Validation Loss: 0.000106
Epoch [72/300], Train Loss: 0.000110
Validation Loss: 0.000105
Epoch [73/300], Train Loss: 0.000109
Validation Loss: 0.000106
Epoch [74/300], Train Loss: 0.000108
Validation Loss: 0.000107
Epoch [75/300], Train Loss: 0.000108
Validation Loss: 0.000109
Epoch [76/300], Train Loss: 0.000107
Validation Loss: 0.000104
Epoch [77/300], Train Loss: 0.000107
Validation Loss: 0.000105
Epoch [78/300], Train Loss: 0.000107
Validation Loss: 0.000107
Epoch [79/300], Train Loss: 0.000107
Validation Loss: 0.000105
Epoch [80/300], Train Loss: 0.000106
Validation Loss: 0.000105
Epoch [81/300], Train Loss: 0.000106
Validation Loss: 0.000106
Epoch [82/300], Train Loss: 0.000105
Validation Loss: 0.000108
Epoch [83/300], Train Loss: 0.000105
Validation Loss: 0.000106
Epoch [84/300], Train Loss: 0.000105
Validation Loss: 0.000105
Epoch [85/300], Train Loss: 0.000105
Validation Loss: 0.000103
Epoch [86/300], Train Loss: 0.000106
Validation Loss: 0.000104
Epoch [87/300], Train Loss: 0.000104
Validation Loss: 0.000103
Epoch [88/300], Train Loss: 0.000105
Validation Loss: 0.000104
Epoch [89/300], Train Loss: 0.000103
Validation Loss: 0.000103
Epoch [90/300], Train Loss: 0.000104
Validation Loss: 0.000106
Epoch [91/300], Train Loss: 0.000103
Validation Loss: 0.000103
Epoch [92/300], Train Loss: 0.000103
Validation Loss: 0.000102
Epoch [93/300], Train Loss: 0.000103
Validation Loss: 0.000103
Epoch [94/300], Train Loss: 0.000102
Validation Loss: 0.000102
Epoch [95/300], Train Loss: 0.000102
Validation Loss: 0.000102
Epoch [96/300], Train Loss: 0.000103
Validation Loss: 0.000105
Epoch [97/300], Train Loss: 0.000102
Validation Loss: 0.000105
Epoch [98/300], Train Loss: 0.000102
Validation Loss: 0.000102
Epoch [99/300], Train Loss: 0.000102
Validation Loss: 0.000100
Epoch [100/300], Train Loss: 0.000102
Validation Loss: 0.000101
Epoch [101/300], Train Loss: 0.000101
Validation Loss: 0.000101
Epoch [102/300], Train Loss: 0.000101
Validation Loss: 0.000102
Epoch [103/300], Train Loss: 0.000101
Validation Loss: 0.000101
Epoch [104/300], Train Loss: 0.000102
Validation Loss: 0.000100
Epoch [105/300], Train Loss: 0.000100
Validation Loss: 0.000102
Epoch [106/300], Train Loss: 0.000100
Validation Loss: 0.000100
Epoch [107/300], Train Loss: 0.000100
Validation Loss: 0.000100
Epoch [108/300], Train Loss: 0.000100
Validation Loss: 0.000100
Epoch [109/300], Train Loss: 0.000100
Validation Loss: 0.000100
Epoch [110/300], Train Loss: 0.000099
Validation Loss: 0.000102
Epoch [111/300], Train Loss: 0.000100
Validation Loss: 0.000104
Epoch [112/300], Train Loss: 0.000100
Validation Loss: 0.000099
Epoch [113/300], Train Loss: 0.000100
Validation Loss: 0.000101
Epoch [114/300], Train Loss: 0.000099
Validation Loss: 0.000100
Epoch [115/300], Train Loss: 0.000099
Validation Loss: 0.000100
Epoch [116/300], Train Loss: 0.000099
Validation Loss: 0.000099
Epoch [117/300], Train Loss: 0.000099
Validation Loss: 0.000099
Epoch [118/300], Train Loss: 0.000099
Validation Loss: 0.000099
Epoch [119/300], Train Loss: 0.000098
Validation Loss: 0.000100
Epoch [120/300], Train Loss: 0.000098
Validation Loss: 0.000099
Epoch [121/300], Train Loss: 0.000099
Validation Loss: 0.000100
Epoch [122/300], Train Loss: 0.000098
Validation Loss: 0.000098
Epoch [123/300], Train Loss: 0.000098
Validation Loss: 0.000098
Epoch [124/300], Train Loss: 0.000098
Validation Loss: 0.000099
Epoch [125/300], Train Loss: 0.000097
Validation Loss: 0.000098
Epoch [126/300], Train Loss: 0.000098
Validation Loss: 0.000098
Epoch [127/300], Train Loss: 0.000098
Validation Loss: 0.000097
Epoch [128/300], Train Loss: 0.000096
Validation Loss: 0.000096
Epoch [129/300], Train Loss: 0.000097
Validation Loss: 0.000096
Epoch [130/300], Train Loss: 0.000097
Validation Loss: 0.000098
Epoch [131/300], Train Loss: 0.000097
Validation Loss: 0.000097
Epoch [132/300], Train Loss: 0.000096
Validation Loss: 0.000096
Epoch [133/300], Train Loss: 0.000096
Validation Loss: 0.000097
Epoch [134/300], Train Loss: 0.000096
Validation Loss: 0.000097
Epoch [135/300], Train Loss: 0.000096
Validation Loss: 0.000096
Epoch [136/300], Train Loss: 0.000095
Validation Loss: 0.000096
Epoch [137/300], Train Loss: 0.000095
Validation Loss: 0.000096
Epoch [138/300], Train Loss: 0.000095
Validation Loss: 0.000099
Epoch [139/300], Train Loss: 0.000095
Validation Loss: 0.000097
Epoch [140/300], Train Loss: 0.000094
Validation Loss: 0.000096
Epoch [141/300], Train Loss: 0.000096
Validation Loss: 0.000096
Epoch [142/300], Train Loss: 0.000094
Validation Loss: 0.000095
Epoch [143/300], Train Loss: 0.000095
Validation Loss: 0.000095
Epoch [144/300], Train Loss: 0.000094
Validation Loss: 0.000094
Epoch [145/300], Train Loss: 0.000094
Validation Loss: 0.000095
Epoch [146/300], Train Loss: 0.000094
Validation Loss: 0.000097
Epoch [147/300], Train Loss: 0.000095
Validation Loss: 0.000096
Epoch [148/300], Train Loss: 0.000094
Validation Loss: 0.000095
Epoch [149/300], Train Loss: 0.000094
Validation Loss: 0.000094
Epoch [150/300], Train Loss: 0.000094
Validation Loss: 0.000093
Epoch [151/300], Train Loss: 0.000094
Validation Loss: 0.000093
Epoch [152/300], Train Loss: 0.000094
Validation Loss: 0.000093
Epoch [153/300], Train Loss: 0.000093
Validation Loss: 0.000094
Epoch [154/300], Train Loss: 0.000094
Validation Loss: 0.000094
Epoch [155/300], Train Loss: 0.000093
Validation Loss: 0.000094
Epoch [156/300], Train Loss: 0.000093
Validation Loss: 0.000095
Epoch [157/300], Train Loss: 0.000093
Validation Loss: 0.000094
Epoch [158/300], Train Loss: 0.000093
Validation Loss: 0.000093
Epoch [159/300], Train Loss: 0.000093
Validation Loss: 0.000094
Epoch [160/300], Train Loss: 0.000092
Validation Loss: 0.000096
Epoch [161/300], Train Loss: 0.000093
Validation Loss: 0.000093
Epoch [162/300], Train Loss: 0.000093
Validation Loss: 0.000093
Epoch [163/300], Train Loss: 0.000092
Validation Loss: 0.000093
Epoch [164/300], Train Loss: 0.000092
Validation Loss: 0.000092
Epoch [165/300], Train Loss: 0.000092
Validation Loss: 0.000097
Epoch [166/300], Train Loss: 0.000092
Validation Loss: 0.000091
Epoch [167/300], Train Loss: 0.000092
Validation Loss: 0.000092
Epoch [168/300], Train Loss: 0.000092
Validation Loss: 0.000092
Epoch [169/300], Train Loss: 0.000091
Validation Loss: 0.000091
Epoch [170/300], Train Loss: 0.000091
Validation Loss: 0.000091
Epoch [171/300], Train Loss: 0.000091
Validation Loss: 0.000092
Epoch [172/300], Train Loss: 0.000091
Validation Loss: 0.000090
Epoch [173/300], Train Loss: 0.000091
Validation Loss: 0.000091
Epoch [174/300], Train Loss: 0.000092
Validation Loss: 0.000090
Epoch [175/300], Train Loss: 0.000091
Validation Loss: 0.000091
Epoch [176/300], Train Loss: 0.000091
Validation Loss: 0.000089
Epoch [177/300], Train Loss: 0.000090
Validation Loss: 0.000094
Epoch [178/300], Train Loss: 0.000090
Validation Loss: 0.000092
Epoch [179/300], Train Loss: 0.000090
Validation Loss: 0.000092
Epoch [180/300], Train Loss: 0.000090
Validation Loss: 0.000089
Epoch [181/300], Train Loss: 0.000090
Validation Loss: 0.000092
Epoch [182/300], Train Loss: 0.000090
Validation Loss: 0.000092
Epoch [183/300], Train Loss: 0.000090
Validation Loss: 0.000090
Epoch [184/300], Train Loss: 0.000090
Validation Loss: 0.000089
Epoch [185/300], Train Loss: 0.000090
Validation Loss: 0.000089
Epoch [186/300], Train Loss: 0.000090
Validation Loss: 0.000089
Epoch [187/300], Train Loss: 0.000090
Validation Loss: 0.000089
Epoch [188/300], Train Loss: 0.000089
Validation Loss: 0.000089
Epoch [189/300], Train Loss: 0.000089
Validation Loss: 0.000090
Epoch [190/300], Train Loss: 0.000089
Validation Loss: 0.000089
Epoch [191/300], Train Loss: 0.000089
Validation Loss: 0.000092
Epoch [192/300], Train Loss: 0.000089
Validation Loss: 0.000089
Epoch [193/300], Train Loss: 0.000090
Validation Loss: 0.000090
Epoch [194/300], Train Loss: 0.000089
Validation Loss: 0.000091
Epoch [195/300], Train Loss: 0.000088
Validation Loss: 0.000088
Epoch [196/300], Train Loss: 0.000088
Validation Loss: 0.000088
Epoch [197/300], Train Loss: 0.000089
Validation Loss: 0.000087
Epoch [198/300], Train Loss: 0.000088
Validation Loss: 0.000090
Epoch [199/300], Train Loss: 0.000089
Validation Loss: 0.000088
Epoch [200/300], Train Loss: 0.000088
Validation Loss: 0.000088
Epoch [201/300], Train Loss: 0.000088
Validation Loss: 0.000087
Epoch [202/300], Train Loss: 0.000088
Validation Loss: 0.000087
Epoch [203/300], Train Loss: 0.000088
Validation Loss: 0.000087
Epoch [204/300], Train Loss: 0.000088
Validation Loss: 0.000090
Epoch [205/300], Train Loss: 0.000088
Validation Loss: 0.000088
Epoch [206/300], Train Loss: 0.000088
Validation Loss: 0.000086
Epoch [207/300], Train Loss: 0.000087
Validation Loss: 0.000087
Epoch [208/300], Train Loss: 0.000087
Validation Loss: 0.000087
Epoch [209/300], Train Loss: 0.000088
Validation Loss: 0.000086
Epoch [210/300], Train Loss: 0.000088
Validation Loss: 0.000086
Epoch [211/300], Train Loss: 0.000087
Validation Loss: 0.000086
Epoch [212/300], Train Loss: 0.000087
Validation Loss: 0.000087
Epoch [213/300], Train Loss: 0.000087
Validation Loss: 0.000092
Epoch [214/300], Train Loss: 0.000087
Validation Loss: 0.000088
Epoch [215/300], Train Loss: 0.000087
Validation Loss: 0.000087
Epoch [216/300], Train Loss: 0.000087
Validation Loss: 0.000087
Epoch [217/300], Train Loss: 0.000086
Validation Loss: 0.000086
Epoch [218/300], Train Loss: 0.000086
Validation Loss: 0.000092
Epoch [219/300], Train Loss: 0.000087
Validation Loss: 0.000086
Epoch [220/300], Train Loss: 0.000086
Validation Loss: 0.000090
Epoch [221/300], Train Loss: 0.000086
Validation Loss: 0.000087
Epoch [222/300], Train Loss: 0.000086
Validation Loss: 0.000086
Epoch [223/300], Train Loss: 0.000086
Validation Loss: 0.000085
Epoch [224/300], Train Loss: 0.000086
Validation Loss: 0.000094
Epoch [225/300], Train Loss: 0.000087
Validation Loss: 0.000086
Epoch [226/300], Train Loss: 0.000086
Validation Loss: 0.000086
Epoch [227/300], Train Loss: 0.000085
Validation Loss: 0.000087
Epoch [228/300], Train Loss: 0.000085
Validation Loss: 0.000087
Epoch [229/300], Train Loss: 0.000086
Validation Loss: 0.000085
Epoch [230/300], Train Loss: 0.000085
Validation Loss: 0.000085
Epoch [231/300], Train Loss: 0.000085
Validation Loss: 0.000085
Epoch [232/300], Train Loss: 0.000086
Validation Loss: 0.000091
Epoch [233/300], Train Loss: 0.000085
Validation Loss: 0.000090
Epoch [234/300], Train Loss: 0.000085
Validation Loss: 0.000084
Epoch [235/300], Train Loss: 0.000085
Validation Loss: 0.000086
Epoch [236/300], Train Loss: 0.000085
Validation Loss: 0.000087
Epoch [237/300], Train Loss: 0.000086
Validation Loss: 0.000085
Epoch [238/300], Train Loss: 0.000085
Validation Loss: 0.000086
Epoch [239/300], Train Loss: 0.000085
Validation Loss: 0.000084
Epoch [240/300], Train Loss: 0.000085
Validation Loss: 0.000084
Epoch [241/300], Train Loss: 0.000085
Validation Loss: 0.000086
Epoch [242/300], Train Loss: 0.000085
Validation Loss: 0.000084
Epoch [243/300], Train Loss: 0.000085
Validation Loss: 0.000087
Epoch [244/300], Train Loss: 0.000085
Validation Loss: 0.000085
Early stopping triggered
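The run halts at epoch 244 of 300, so the loop evidently tracks the best validation loss with a patience counter. A generic sketch of that pattern — the patience value is an assumption, since the actual setting is not shown in the log:

```python
def early_stop_epoch(val_losses, patience=10):
    """Return the 1-based epoch at which early stopping fires, or None.

    Stops once the best validation loss has not improved for
    `patience` consecutive epochs.
    """
    best = float("inf")
    stale = 0
    for epoch, val_loss in enumerate(val_losses, start=1):
        if val_loss < best:
            best, stale = val_loss, 0  # new best: reset the counter
        else:
            stale += 1
            if stale >= patience:
                print("Early stopping triggered")
                return epoch
    return None

# Loss stops improving after epoch 3, so stopping fires at epoch 3 + patience
print(early_stop_epoch([0.5, 0.4, 0.3] + [0.3] * 20, patience=5))  # 8
```

In practice the loop would also checkpoint the model weights whenever `best` improves, so the saved model corresponds to the lowest validation loss rather than the final epoch.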

Evaluating model for: Other
Validation MAE: 63.005836 W
Validation MSE: 17196.005859 W²
Validation RMSE: 131.133545 W
Signal Aggregate Error (SAE): 0.000384
Normalized Disaggregation Error (NDE): 0.137483
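SAE and NDE are the standard NILM aggregate metrics. Under their usual definitions — assumed here, since the evaluation code is not shown — they can be computed alongside the per-sample errors as:

```python
import numpy as np

def nilm_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Per-sample regression metrics plus the NILM-specific SAE and NDE."""
    err = y_pred - y_true
    mse = float(np.mean(err ** 2))
    return {
        "MAE": float(np.mean(np.abs(err))),
        "MSE": mse,
        "RMSE": mse ** 0.5,
        # SAE: relative error of total predicted vs. total true energy
        "SAE": float(abs(y_pred.sum() - y_true.sum()) / y_true.sum()),
        # NDE: squared error normalized by the energy of the true signal
        "NDE": float(np.sum(err ** 2) / np.sum(y_true ** 2)),
    }

m = nilm_metrics(np.array([100.0, 200.0, 300.0]),
                 np.array([110.0, 190.0, 310.0]))
print({k: round(v, 4) for k, v in m.items()})
```

Note the contrast visible in the results above: SAE is tiny (total energy is well matched) while NDE is larger, meaning the per-timestep allocation still carries noticeable error even though the totals line up.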

[Figure: Training and Validation Loss — per-epoch train/validation loss curves (interactive plot)]